feat(ai-plugin): implement AI model auto-detection and enhance plugin loading reliability #837
base: master
Conversation
I am thinking about whether model detectors should be incorporated into the AI plugin...
Adds support for AI-powered features via a new extension mechanism. This includes dynamically loading the AI extension's CSS and JS, and retrying the plugin mounting process with exponential backoff to ensure proper initialization. It also provides basic gRPC querying that can call AI methods and converts the AI response to the standard data format. The AI plugin can be built from source or downloaded from a binary URL.
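The exponential-backoff retry mentioned above lives in the frontend plugin-mounting code; the sketch below only illustrates the pattern in Go with hypothetical names, not the actual Extension.vue logic.

```go
package main

import (
	"context"
	"fmt"
	"time"
)

// retryWithBackoff calls fn until it succeeds or the attempts run out,
// doubling the delay between tries. Hypothetical helper for illustration;
// the real retry lives in the frontend plugin-mounting code.
func retryWithBackoff(ctx context.Context, attempts int, base time.Duration, fn func() error) error {
	delay := base
	var err error
	for i := 0; i < attempts; i++ {
		if err = fn(); err == nil {
			return nil
		}
		select {
		case <-ctx.Done():
			return ctx.Err()
		case <-time.After(delay):
		}
		delay *= 2 // exponential backoff: base, 2x, 4x, ...
	}
	return fmt.Errorf("gave up after %d attempts: %w", attempts, err)
}
```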
Restore two essential bug fixes that were incorrectly removed:
1. vite.config.ts fixes:
   - Fix the test-id removal logic: only remove test ids in production, preserve them for E2E tests
   - Improve build performance: replace the single chunk with optimized chunk splitting
   - Separate vue, element-plus, and vendor chunks for better caching
2. App.vue fix:
   - Fix the Extension component prop: use menu.index instead of menu.name
   - Ensures the AI plugin can be correctly identified and loaded
These fixes are critical for AI plugin functionality and should not be reverted.
Consolidates the vite build configuration to output a single chunk, simplifying the config and potentially improving build times in some scenarios. The previous manual chunk configuration is removed.
Updates the AI model name description to indicate that the model is automatically detected from available models. Sets the default value for the model to an empty string, reflecting the auto-detection behavior.
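As a rough illustration of the auto-detection behaviour (an empty model name meaning "use whichever model is available"), here is a minimal Go sketch; modelLister and resolveModel are assumed names for illustration, not the actual API of pkg/testing/model_detector.go.

```go
package main

import (
	"context"
	"errors"
	"fmt"
)

// modelLister is a hypothetical abstraction over whatever reports the
// models an AI provider currently offers (e.g. via an ai.capabilities call).
type modelLister interface {
	ListModels(ctx context.Context) ([]string, error)
}

// resolveModel keeps an explicitly configured model and otherwise falls
// back to the first available one, mirroring "empty string = auto-detect".
func resolveModel(ctx context.Context, configured string, lister modelLister) (string, error) {
	if configured != "" {
		return configured, nil
	}
	models, err := lister.ListModels(ctx)
	if err != nil {
		return "", fmt.Errorf("auto-detect model: %w", err)
	}
	if len(models) == 0 {
		return "", errors.New("no AI models available")
	}
	return models[0], nil
}
```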
…AI parameters. Adjust the stores.yaml file structure and add more configuration parameters, such as provider and endpoint, for AI plugins.
The AI plugin build process is removed from the makefile. The AI plugin is assumed to be pre-built or handled by a separate process, simplifying the build and execution flow.
```go
// Keep only the business parameters for the AI call; the "method" field
// (used for internal routing) is intentionally filtered out before encoding.
params := map[string]string{
	"model":  query["model"],
	"prompt": query["prompt"],
	"config": query["config"],
}
```
Do you think it's necessary? It looks like you're just converting the map data into a JSON string.
Good catch! Added comments in 6171a02 to clarify the intent.
The function intentionally filters out the "method" field (used for routing)
before encoding - it's not just a simple map-to-JSON conversion.
(Or we can use Go structs instead of maps to provide better type safety?)
What I mean is: isn't it unnecessary to pull some of the key-value pairs out of a map just to build a new map? Would there be any problem with using the original map directly?
Let me add some background: query contains a method field that is only used for internal routing, while what the downstream actually needs are just the business parameters for the model-generation call. If we encode the original map directly, the downstream sees this routing field, which is useless and easy to misread (they might think they have to handle it), and it also clutters logs and stored records. That's why I pick out only the needed key-value pairs, so that what gets passed on is a clean business payload.
If you feel rebuilding a map isn't ideal, there are a few alternatives, pick one (the first two are sketched below):
- Copy the original map and explicitly delete method (e.g. params := maps.Clone(query); delete(params, "method")), which keeps the code concise;
- Or switch to a struct and mark method with json:"-" so it is ignored; that gives better type safety at the cost of a little more work;
- Or simply pass the method field through as-is.
Which option do you prefer? Or should we keep the current logic?
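For reference, a quick sketch of the first two options (assumes Go 1.21+ for maps.Clone; the field names simply mirror the existing snippet):

```go
package main

import (
	"encoding/json"
	"maps"
)

// Option 1: clone the incoming query and drop the routing-only field.
func encodeByClone(query map[string]string) ([]byte, error) {
	params := maps.Clone(query)
	delete(params, "method")
	return json.Marshal(params)
}

// Option 2: a dedicated struct; "method" is never encoded thanks to json:"-".
type aiGenerateParams struct {
	Method string `json:"-"` // internal routing only
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Config string `json:"config"`
}

func encodeByStruct(p aiGenerateParams) ([]byte, error) {
	return json.Marshal(p)
}
```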
- Remove console.log/error from Extension.vue for production readiness
- Add comments to encodeAIGenerateParams explaining the field-filtering logic
Resolves review feedback from yuluo-yx and LinuxSuRen
What type of PR is this?
Implement AI model auto-detection and enhance plugin loading reliability
What this PR does / why we need it:
This PR adds AI plugin integration with automatic model detection and more reliable plugin loading. The changes address reliability issues in plugin initialization and provide a foundation for dynamic AI model management.
Key Features:
- AI Model Auto-Detection System (pkg/testing/model_detector.go)
- Enhanced Plugin Loading Reliability (console/atest-ui/src/views/Extension.vue)
- AI Plugin Communication Infrastructure (pkg/testing/remote/grpc_store.go), exposing the ai.generate and ai.capabilities methods (a dispatch sketch follows this list)
- Configuration Optimization (cmd/testdata/stores.yaml), including the AI plugin socket unix:///tmp/atest-ext-ai.sock
- Development Environment Improvements
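To make the query routing and response conversion above concrete, here is a hedged sketch; callPluginFunc is a placeholder for whatever actually talks to the extension over gRPC, not the real grpc_store.go API.

```go
package main

import (
	"context"
	"encoding/json"
)

// callPluginFunc stands in for the transport that reaches the AI extension
// over gRPC (e.g. via unix:///tmp/atest-ext-ai.sock). Placeholder signature.
type callPluginFunc func(ctx context.Context, method, payload string) (string, error)

// dispatchAIQuery routes on the "method" field (ai.generate, ai.capabilities),
// sends only the business parameters, and decodes the reply into a generic map.
func dispatchAIQuery(ctx context.Context, query map[string]string, call callPluginFunc) (map[string]any, error) {
	payload, err := json.Marshal(map[string]string{
		"model":  query["model"],
		"prompt": query["prompt"],
		"config": query["config"],
	})
	if err != nil {
		return nil, err
	}
	reply, err := call(ctx, query["method"], string(payload))
	if err != nil {
		return nil, err
	}
	result := map[string]any{}
	if err := json.Unmarshal([]byte(reply), &result); err != nil {
		return nil, err
	}
	return result, nil
}
```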
Why we need it:
Testing:
This PR transforms the AI plugin system from a basic proof-of-concept into a production-ready, intelligent, and reliable integration platform.